Linear Reconstruction


Reconstructing the local density field with combined convolutional and point cloud architecture

Barthe-Gold, Baptiste, Nguyen, Nhat-Minh, Thiele, Leander

arXiv.org Machine Learning

We construct a neural network to perform regression on the local dark-matter density field given line-of-sight peculiar velocities of dark-matter halos, which are biased tracers of the dark-matter field. Our architecture combines a convolutional U-Net with a point-cloud DeepSets network. This combination enables efficient use of small-scale information and improves reconstruction quality relative to a U-Net-only approach. Specifically, our hybrid network recovers both clustering amplitudes and phases better than the U-Net on small scales.
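The point-cloud side of this architecture rests on a permutation-invariant set encoder. As a minimal sketch of that principle (not the authors' network; all layer shapes, weights, and names here are illustrative assumptions), a DeepSets-style embedding applies a shared per-point encoder, pools with a symmetric sum, and decodes the pooled vector:

```python
import numpy as np

rng = np.random.default_rng(0)

def deepsets_embed(points, W_phi, W_rho):
    """Permutation-invariant set embedding: a shared per-point encoder phi,
    sum pooling, then a set-level decoder rho (single ReLU layers each)."""
    h = np.maximum(points @ W_phi, 0.0)      # phi: per-point features, (k, 16)
    pooled = h.sum(axis=0)                   # order-independent aggregation
    return np.maximum(pooled @ W_rho, 0.0)   # rho: set-level representation, (4,)

points = rng.normal(size=(8, 3))   # stand-in for a small halo catalog
W_phi = rng.normal(size=(3, 16))
W_rho = rng.normal(size=(16, 4))
out = deepsets_embed(points, W_phi, W_rho)
shuffled = points[rng.permutation(len(points))]
```

Because the pooling is a sum, the output is unchanged under any reordering of the input points, which is what makes this kind of module suitable for catalogs of tracers.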


Generative Locally Linear Embedding

Ghojogh, Benyamin, Ghodsi, Ali, Karray, Fakhri, Crowley, Mark

arXiv.org Machine Learning

Locally Linear Embedding (LLE) is a nonlinear spectral dimensionality reduction and manifold learning method. It has two main steps: linear reconstruction and linear embedding of points in the input space and the embedding space, respectively. In this work, we propose two novel generative versions of LLE, named Generative LLE (GLLE), whose linear reconstruction steps are stochastic rather than deterministic. GLLE assumes that every data point is caused by its linear reconstruction weights as latent factors. The proposed GLLE algorithms can generate various LLE embeddings stochastically, while all the generated embeddings relate to the original LLE embedding. We propose two versions of stochastic linear reconstruction: one using expectation maximization, and another that samples directly from a distribution derived by optimization. The proposed GLLE methods are closely related to and inspired by variational inference, factor analysis, and probabilistic principal component analysis. Our simulations show that the proposed GLLE methods work effectively in unfolding and generating submanifolds of data.
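The deterministic linear reconstruction step that GLLE replaces with a stochastic one solves, for each point, for neighbor weights that sum to one and minimize the reconstruction error. A minimal sketch of that classic closed-form step (the function name and regularization choice are illustrative, not from the paper):

```python
import numpy as np

def lle_reconstruction_weights(x, neighbors, reg=1e-3):
    """Solve for weights w with sum(w) = 1 minimizing ||x - w @ neighbors||^2.
    Closed form: solve C w = 1 on the local Gram matrix, then normalize."""
    k = len(neighbors)
    Z = neighbors - x                         # (k, d) neighbor offsets
    C = Z @ Z.T                               # (k, k) local Gram matrix
    C += reg * np.trace(C) * np.eye(k) / k    # regularize for stability
    w = np.linalg.solve(C, np.ones(k))
    return w / w.sum()                        # enforce sum-to-one constraint

rng = np.random.default_rng(0)
x = rng.normal(size=3)
nbrs = rng.normal(size=(5, 3))                # 5 nearest neighbors of x
w = lle_reconstruction_weights(x, nbrs)
```

GLLE's contribution, per the abstract, is to treat such weights as latent factors and draw them stochastically rather than fixing them at this deterministic solution.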


Locally Linear Embedding and its Variants: Tutorial and Survey

Ghojogh, Benyamin, Ghodsi, Ali, Karray, Fakhri, Crowley, Mark

arXiv.org Machine Learning

This is a tutorial and survey paper on Locally Linear Embedding (LLE) and its variants. The idea of LLE is to fit the local structure of the manifold in the embedding space. In this paper, we first cover LLE, kernel LLE, inverse LLE, and feature fusion with LLE. Then, we cover out-of-sample embedding using linear reconstruction, eigenfunctions, and kernel mapping. Incremental LLE is explained for embedding streaming data. Landmark LLE methods, using the Nyström approximation and locally linear landmarks, are explained for big-data embedding. We introduce methods for selecting the number of neighbors using residual variance, Procrustes statistics, preservation neighborhood error, and local neighborhood selection. Afterwards, Supervised LLE (SLLE), enhanced SLLE, SLLE projection, probabilistic SLLE, supervised guided LLE (using the Hilbert-Schmidt independence criterion), and semi-supervised LLE are explained for supervised and semi-supervised embedding. Robust LLE methods, using the least-squares problem and penalty functions, are also introduced for embedding in the presence of outliers and noise. Then, we introduce the fusion of LLE with other manifold learning methods, including Isomap (i.e., ISOLLE), principal component analysis, Fisher discriminant analysis, discriminant LLE, and Isotop. Finally, we explain weighted LLE, in which the distances, reconstruction weights, or embeddings are adjusted for a better embedding; we cover weighted LLE for deformed distributed data, weighted LLE using probability of occurrence, SLLE by adjusting weights, modified LLE, and iterative LLE.
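Out-of-sample embedding via linear reconstruction, one of the extensions the survey covers, can be sketched as: compute LLE-style reconstruction weights for the unseen point over its nearest training points, then apply those same weights to the training points' existing embeddings. All names and the regularization below are illustrative assumptions, not the survey's notation:

```python
import numpy as np

def out_of_sample_embed(x_new, X_train, Y_train, k=5, reg=1e-3):
    """Embed an unseen point: reconstruct x_new from its k nearest training
    points with sum-to-one weights, then combine their embeddings."""
    d2 = ((X_train - x_new) ** 2).sum(axis=1)
    idx = np.argsort(d2)[:k]                  # k nearest training points
    Z = X_train[idx] - x_new
    C = Z @ Z.T                               # local Gram matrix, (k, k)
    C += reg * np.trace(C) * np.eye(k) / k    # regularize for stability
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()                              # weights sum to one
    return w @ Y_train[idx]                   # same weights in embedding space

rng = np.random.default_rng(1)
X_train = rng.normal(size=(50, 3))
Y_train = rng.normal(size=(50, 2))   # stand-in for a precomputed LLE embedding
y_new = out_of_sample_embed(rng.normal(size=3), X_train, Y_train)
```

The key point is that no re-run of the full eigendecomposition is needed: the new point inherits its embedding from its neighbors' already-computed coordinates.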


Locally Linear Image Structural Embedding for Image Structure Manifold Learning

Ghojogh, Benyamin, Karray, Fakhri, Crowley, Mark

arXiv.org Machine Learning

Most existing manifold learning methods rely on the Mean Squared Error (MSE) or the $\ell_2$ norm. However, for the problem of image quality assessment, these are not promising measures. In this paper, we introduce the concept of an image structure manifold which captures image structure features and discriminates image distortions. We propose a new manifold learning method, Locally Linear Image Structural Embedding (LLISE), and kernel LLISE for learning this manifold. LLISE is inspired by Locally Linear Embedding (LLE) but uses the Structural Similarity Index (SSIM) rather than MSE. This paper builds a bridge between manifold learning and image fidelity assessment, and it can open a new area for future investigations.
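For reference, the measure LLISE substitutes for MSE can be computed in its single-window (global) form as below; practical SSIM averages this quantity over local windows. The function is an illustrative sketch using the standard SSIM constants, not the paper's code:

```python
import numpy as np

def ssim_global(x, y, data_range=1.0):
    """Single-window SSIM between two images with values in [0, data_range].
    C1 and C2 are the standard stabilizers (0.01 and 0.03 times the range,
    squared) that keep the ratio well-defined near zero means/variances."""
    C1 = (0.01 * data_range) ** 2
    C2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(0)
img = rng.random((32, 32))
noisy = np.clip(img + 0.1 * rng.normal(size=img.shape), 0.0, 1.0)
```

SSIM equals 1 only for identical images and penalizes structural distortion rather than raw pixel differences, which is what makes it a better fit than MSE for image quality assessment.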


Document Summarization Based on Data Reconstruction

He, Zhanying (Zhejiang University) | Chen, Chun (Zhejiang University) | Bu, Jiajun (Zhejiang University) | Wang, Can (Zhejiang University) | Zhang, Lijun (Zhejiang University) | Cai, Deng (Zhejiang University) | He, Xiaofei (Zhejiang University)

AAAI Conferences

Document summarization is of great value to many real-world applications, such as snippet generation for search results and news headline generation. Traditionally, document summarization is implemented by extracting sentences that cover the main topics of a document with minimum redundancy. In this paper, we take a different perspective based on data reconstruction and propose a novel framework named Document Summarization based on Data Reconstruction (DSDR). Specifically, our approach generates a summary consisting of those sentences that can best reconstruct the original document. To model the relationship among sentences, we introduce two objective functions: (1) linear reconstruction, which approximates the document by linear combinations of the selected sentences; and (2) nonnegative linear reconstruction, which allows only additive, not subtractive, linear combinations. In this framework, the reconstruction error becomes a natural criterion for measuring the quality of the summary. For each objective function, we develop an efficient algorithm to solve the corresponding optimization problem. Extensive experiments on the summarization benchmark data sets DUC 2006 and DUC 2007 demonstrate the effectiveness of our proposed approach.
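The nonnegative linear reconstruction objective can be sketched with projected gradient descent: given feature vectors for the document's sentences and a candidate set of selected sentences, find nonnegative coefficients minimizing the Frobenius reconstruction error. This is a minimal illustration with made-up names and a fixed sentence set, not the paper's algorithm, which also handles the harder problem of choosing which sentences to select:

```python
import numpy as np

def nonneg_reconstruct(V, S, iters=5000):
    """Find A >= 0 minimizing ||V - A S||_F^2 by projected gradient descent.
    V: (n, d) document sentence vectors; S: (m, d) selected sentences."""
    lr = 1.0 / (np.linalg.norm(S, 2) ** 2 + 1e-12)  # step <= 1/Lipschitz
    A = np.zeros((V.shape[0], S.shape[0]))
    for _ in range(iters):
        grad = (A @ S - V) @ S.T           # gradient of 0.5 * ||V - A S||_F^2
        A = np.maximum(A - lr * grad, 0.0) # project onto the A >= 0 orthant
    err = np.linalg.norm(V - A @ S) ** 2   # reconstruction error (summary cost)
    return A, err

rng = np.random.default_rng(0)
S = rng.normal(size=(3, 6))     # 3 "selected sentence" vectors
V = rng.random((10, 3)) @ S     # a document exactly reconstructable from S
A, err = nonneg_reconstruct(V, S)
```

The nonnegativity constraint is what encodes the "additive, not subtractive" combinations from the abstract, and the final reconstruction error is the quality criterion for a candidate summary.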